Predicting Visual Intelligibility Gain in the SynFace Application
Authors
Abstract
When the acoustic signal is weak or disturbed, people with normal hearing as well as people with hearing impairment make use of visual information to support their interpretation of speech. A talking head could therefore be of great support to the hearing impaired in speech activities where no visual information is available, such as a telephone conversation or listening to an audio book. SynFace, a talking head developed at the Royal Institute of Technology, is an application in which the lip movements are driven by the acoustic signal and synchronized with the speech; it could thus be a valuable support, since it provides sufficient visual information. The purpose of this project is to find an error metric that can be used to predict the visual intelligibility gain obtained when using the SynFace application. The metric is based on visual similarity among Swedish vowels and consonants, respectively. The perceptual distances were established in an experiment in which normal-hearing subjects watched silent movie clips where nonsense words were presented by a synthetic as well as a natural face. Error metrics were calculated for three different networks using frame-by-frame comparison, with the perceptual distances used as weights for the phoneme recognition errors. The calculated error metrics were then mapped to intelligibility in terms of SRT (Speech Reception Threshold) reduction in noise. A linear relationship between the performance of the network and the SRT levels was found.

Swedish title: Visuell uppfattbarhetsprediktion i SynFace
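The weighted frame-by-frame comparison described above can be sketched as follows. This is a minimal illustration of the idea, not the thesis's actual implementation: the phoneme labels, the perceptual-distance table, and all function names are assumptions made for the example. Each recognised frame is compared with the reference frame, and the cost of a confusion is the perceptual (visual) distance between the two phonemes rather than a flat 0/1 error.

```python
def weighted_error(reference, recognized, distance):
    """Mean perceptual distance between reference and recognised
    phoneme labels, compared frame by frame (hypothetical sketch)."""
    assert len(reference) == len(recognized), "sequences must be frame-aligned"
    total = sum(distance[ref][hyp] for ref, hyp in zip(reference, recognized))
    return total / len(reference)


# Toy perceptual-distance table for three vowels
# (0 = visually identical, 1 = maximally distinct); illustrative values only.
distance = {
    "a": {"a": 0.0, "o": 0.3, "i": 0.9},
    "o": {"a": 0.3, "o": 0.0, "i": 0.8},
    "i": {"a": 0.9, "o": 0.8, "i": 0.0},
}

reference = ["a", "a", "o", "i"]   # hand-annotated frames
recognized = ["a", "o", "o", "i"]  # recogniser output

print(weighted_error(reference, recognized, distance))  # 0.3 / 4 = 0.075
```

A confusion between visually similar phonemes (here "a" vs "o") is penalised less than one between visually distinct phonemes, which is what lets the metric track *visual* intelligibility. The mapping to SRT reduction would then be a simple linear fit of SRT against this error value.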
Related works
Experiment with asynchrony in multimodal speech communication
The purpose of this study was to examine the delay effects in audiovisual speech perception for natural and synthetic faces. The main focus was on the SYNFACE project, the development of a telephone communication aid for hearing impaired persons. In the experiments, the consequence of temporal displacement of the audio in relation to the visual channel was investigated. The audio channel was na...
Lipreadability of a synthetic talking face in normal hearing and hearing-impaired listeners
The Synface project is developing a synthetic talking face to aid the hearing-impaired in telephone conversation. This report investigates the gain in intelligibility from the synthetic talking head when controlled by hand-annotated speech in both 12 normal hearing (NH) and 13 hearing-impaired (HI) listeners (average hearing loss 86 dB). For NH listeners, audio from everyday sentences was degra...
This is a placeholder. Final title will be filled later
The Synface project is developing a synthetic talking face to aid the hearing impaired in telephone conversation. This report investigates the gain in intelligibility from the synthetic talking head when controlled by hand-annotated speech. Audio from Swedish, English and Dutch sentences was degraded to simulate the information losses that arise in severe-to-profound hearing impairment. 12 norm...
SynFace - Speech-Driven Facial Animation for Virtual Speech-Reading Support
This paper describes SynFace, a supportive technology that aims at enhancing audio-based spoken communication in adverse acoustic conditions by providing the missing visual information in the form of an animated talking head. Firstly, we describe the system architecture, consisting of a 3D animated face model controlled from the speech input by a specifically optimised phonetic recogniser. Seco...
Speaker separation using visual speech features and single-channel audio
This work proposes a method of single-channel speaker separation that uses visual speech information to extract a target speaker’s speech from a mixture of speakers. The method requires a single audio input and visual features extracted from the mouth region of each speaker in the mixture. The visual information from speakers is used to create a visually-derived Wiener filter. The Wiener filter...
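The visually-derived Wiener filter mentioned above can be illustrated with the standard per-bin Wiener gain. This is a hedged sketch of the general technique, not the cited paper's implementation: it assumes the target speaker's power spectrum has already been estimated from visual (mouth-region) features, and all names are illustrative.

```python
import numpy as np


def wiener_gain(target_psd, mixture_psd, eps=1e-12):
    """Per time-frequency-bin Wiener gain: estimated target power
    divided by observed mixture power (eps avoids division by zero)."""
    return target_psd / (mixture_psd + eps)


def separate(mixture_stft, target_psd):
    """Apply the visually-derived gain to the mixture spectrogram
    to recover an estimate of the target speaker's STFT."""
    gain = wiener_gain(target_psd, np.abs(mixture_stft) ** 2)
    return np.clip(gain, 0.0, 1.0) * mixture_stft
```

In bins dominated by the target speaker the gain approaches 1 and the mixture passes through; in bins dominated by the interfering speaker the gain approaches 0 and the energy is suppressed.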
Published: 2009